In-HIT Example-Guided Annotation Aid for Crowdsourcing UI Components

Authors

  • Yi-Ching Huang
  • Chun-I Wang
  • Shih-Yuan Yu
  • Jane Yung-jen Hsu
Abstract

This paper presents an approach to crowdsourcing annotations of UI components from images. Using the "Find-Draw-Verify" task design, an in-HIT example-guided annotation aid is proposed to assist workers, thereby improving the quality of the results.


Similar Articles

NEAT: News Exploration Along Time

There are a number of efforts towards building applications that leverage temporal information in documents. Our NEAT (News Exploration Along Time) prototype system, demonstrated here, is an attempt to build an intuitive, exploratory timeline interface for search results over large news archives. The demonstration uses the New York Times Annotated Corpus...


Empirically-derived Methodology for Crowdsourcing Ground Truth

The main challenge for cognitive computing systems, and specifically for their natural language processing, video, and image analysis components, is to be provided with large amounts of training and evaluation data. The traditional process for gathering ground truth data is lengthy, costly, and time-consuming: (i) expert annotators are not always available; (ii) automated methods generate data w...


Scaling drug indication curation through crowdsourcing

Motivated by the high cost of human curation of biological databases, there is an increasing interest in using computational approaches to assist human curators and accelerate the manual curation process. Towards the goal of cataloging drug indications from FDA drug labels, we recently developed LabeledIn, a human-curated drug indication resource for 250 clinical drugs. Its development required...


A Methodology for Corpus Annotation through Crowdsourcing

In contrast to expert-based annotation, for which elaborate methodologies ensure high quality output, currently no systematic guidelines exist for crowdsourcing annotated corpora, despite the increasing popularity of this approach. To address this gap, we define a crowd-based annotation methodology, compare it against the OntoNotes methodology for expert-based annotation, and identify future ch...


Modelling Crowdsourcing Originated Keywords within the Athletics Domain

Image classification arises as an important phase in the overall process of automatic image annotation and image retrieval. Usually, a set of manually annotated images is used to train supervised systems and classify images into classes. Work on crowdsourcing has largely focused on investigating strategies for reducing the time, cost, and effort required for the creation of the annotated data...



Journal:

Volume   Issue 

Pages  -

Publication date: 2013